Procedure for the Selection of Principal Components in Principal Components Regression

Authors
Abstract


Similar resources

Nonparametric Principal Components Regression

In ordinary least squares regression, dimensionality is a sensitive issue. As the number of independent variables approaches the sample size, the least squares algorithm can easily fail: estimates are not unique or are very unstable (Draper and Smith, 1981). There are several problems usually encountered in modeling high-dimensional data, including the difficulty of visualizing the data, s...
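The instability this abstract describes is easy to reproduce. The following is a minimal sketch (illustrative only, not code from the paper) showing that two nearly collinear predictors make the normal-equations matrix X'X severely ill-conditioned, so least squares estimates become extremely sensitive to small perturbations:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
x = rng.normal(size=n)
# Two nearly collinear predictors: the second column almost duplicates the first
X = np.column_stack([x, x + 1e-6 * rng.normal(size=n)])

# The condition number of X'X explodes; coefficient estimates inherit
# this sensitivity and can swing wildly under tiny changes in the data.
cond = np.linalg.cond(X.T @ X)
print(f"condition number of X'X: {cond:.3e}")
```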


Principal Components Regression With Data Chosen Components and Related Methods

Multiple regression with correlated predictor variables is relevant to a broad range of problems in the physical, chemical, and engineering sciences. Chemometricians, in particular, have made heavy use of principal components regression and related procedures for predicting a response variable from a large number of highly correlated predictors. In this paper we develop a general theory that gu...
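As a concrete sketch of the basic procedure (a generic principal components regression, not the data-chosen-components method this paper develops): regress the centered response on the leading principal component scores of the centered predictors, then map the coefficients back to the original variables.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal components regression: regress y on the first k PCs of X."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # SVD of the centered predictors: columns of V are principal directions
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]            # n x k component scores
    # Scores are orthogonal, so least squares reduces to simple divisions
    gamma = scores.T @ yc / s[:k] ** 2
    return Vt[:k].T @ gamma              # coefficients on the original scale
```

Retaining all components (k equal to the rank of X) recovers ordinary least squares; choosing k is exactly the selection problem the title paper addresses.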


Estimating Invariant Principal Components Using Diagonal Regression

In this work we apply the method of diagonal regression to derive an alternative version of Principal Component Analysis (PCA). “Diagonal regression” was introduced by Ragnar Frisch (the first economics Nobel laureate) in his paper “Correlation and Scatter in Statistical Variables” (1928). The benefits of using diagonal regression in PCA are that it provides components that are scale-invariant ...
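In the bivariate case, diagonal regression reduces to the geometric-mean (reduced major axis) line, whose slope is sign(cov) · sd(y)/sd(x); this reduction, and the sketch below, are standard facts about the bivariate case rather than material from the paper. Rescaling x rescales the slope exactly, which is the scale behavior that makes the fitted relationship invariant:

```python
import numpy as np

def diagonal_slope(x, y):
    # Geometric mean of the y-on-x slope and the reciprocal of the
    # x-on-y slope; equals sign(cov) * sd(y) / sd(x) for two variables.
    r = np.corrcoef(x, y)[0, 1]
    return np.sign(r) * np.std(y) / np.std(x)

rng = np.random.default_rng(2)
x = rng.normal(size=100)
y = 2.0 * x + 0.1 * rng.normal(size=100)
b1 = diagonal_slope(x, y)
b2 = diagonal_slope(3.0 * x, y)   # rescaling x divides the slope by 3 exactly
print(b1, b2)
```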


Detecting influential observations in principal components and common principal components

Detecting outlying observations is an important step in any analysis, even when robust estimates are used. In particular, the robustified Mahalanobis distance is a natural measure of outlyingness if one focuses on ellipsoidal distributions. However, it is well known that the asymptotic chi-square approximation for the cutoff value of the Mahalanobis distance based on several robust estimates (l...
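The classical (non-robust) version of the screening rule this abstract starts from can be sketched as follows. The cutoff 7.38 is approximately the 0.975 chi-square quantile with 2 degrees of freedom; the robustified variants the paper studies would replace the sample mean and covariance with robust estimates, which is where the chi-square approximation becomes questionable.

```python
import numpy as np

def mahalanobis_outliers(X, cutoff):
    # Flag rows whose squared Mahalanobis distance from the sample mean
    # exceeds the given chi-square cutoff.
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    d2 = np.einsum("ij,jk,ik->i", d, cov_inv, d)   # squared distances
    return d2 > cutoff

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(size=(200, 2)), [[10.0, 10.0]]])  # one planted outlier
flags = mahalanobis_outliers(X, cutoff=7.38)  # ~chi2(2) 0.975 quantile
```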


Unsupervised feature selection using weighted principal components

Feature selection has received considerable attention in various areas as a way to select informative features and to simplify the statistical model through dimensional reduction. One of the most widely used methods for dimensional reduction is principal component analysis (PCA). Despite its popularity, PCA suffers from a lack of interpretability of the original features because the reduce...
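One simple way to recover per-feature interpretability from PCA, in the spirit of the weighted-components idea (this sketch is a generic illustration, not the authors' exact procedure): score each original feature by its squared loadings on the leading components, weighted by each component's explained variance, then keep the highest-scoring features.

```python
import numpy as np

def pca_feature_scores(X, k):
    # Score each original feature by its squared loadings on the top-k
    # principal components, weighted by explained variance.
    Xc = X - X.mean(axis=0)
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    weights = s[:k] ** 2 / np.sum(s ** 2)   # explained-variance weights
    return (Vt[:k] ** 2).T @ weights        # one score per original feature

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 5))
X[:, 2] *= 10.0                 # make feature 2 dominate the variance
scores = pca_feature_scores(X, k=2)
```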



Journal

Journal title: Korean Journal of Applied Statistics

Year: 2010

ISSN: 1225-066X

DOI: 10.5351/kjas.2010.23.5.967